
Author Discussion on Big Data, C-SPAN, October 7, 2017, 8:01am-9:02am EDT

...and cathy davidson, founding director of the futures initiative at the city university of new york and author of "the new education." also this weekend, a nobel prize-winning economist talks about how to solve the problems of global poverty, unemployment and climate change. >> television for serious readers. for a complete television schedule, visit booktv.org. now we kick off the weekend with a discussion on big data from the brooklyn book festival with authors cathy o'neil and tim wu.
>> good morning. good morning, everyone, welcome. i'm a professor at the nyu stern school of business and author of the forthcoming book "the person you mean to be." i'm very excited about our panel today. this morning we're going to look at how big data has become an unavoidable part of our world: what are the dangers and benefits, what does it say about us, our fears, our dreams, and what should we be thinking about and aware of if we're going to go down this path.
we have a terrific panel: mathematician cathy o'neil and columbia law school's tim wu are with us today. very importantly, i want you to know that i promise you these are books you're going to want to purchase -- [laughter] >> thank you. >> and i promise you these are books you're going to want to purchase, and if you want to do that, when you leave the building and turn left, find table eight, manned by barnes & noble, where you can get copies signed by the authors after this panel. we will have time for questions later, so please keep those in mind until we get to the q&a period. we're going to begin with cathy o'neil: mathematician, former hedge fund quant, and bloomberg view columnist. she wrote "weapons of math destruction: how big data increases inequality and threatens democracy."
cathy, we'll begin with you. tell us about your book. >> great. thanks, dolly, and thanks, everyone, for coming. this is super exciting. my favorite part is talking with people about the questions that come up around big data, and there's been a doozy of a couple of weeks -- wow, we've got a lot to talk about. facebook ads, algorithms, political ads. so my book is about, you know, sort of the algorithms that we need to worry about. i call the worst algorithms weapons of math destruction. and to be honest, six years ago when i decided to write this book, i was a data scientist making algorithms. i was deciding who got what option on the internet -- who would get this offer, who wouldn't get this offer.
and i was doing it, by the way, having been in quantitative finance during the crisis, so i already knew firsthand what could go wrong with algorithms. [laughter] and it burned me, right? we were working in finance, and the aaa ratings on mortgage-backed securities were mathematical lies, but they were believable lies, and they took in a lot of people. the machine of the mortgages kept going in large part because of those aaa ratings. it was an abuse of mathematical trust, and i didn't appreciate that. so i actually left finance hoping to do a better job. i went to work in data science after joining occupy, and i soon realized that, on the one hand, i was doing almost exactly the same thing: instead of predicting the futures markets with my statistical algorithms, i was predicting human behavior, right? but i was also potentially doing something just as destructive as what i had seen happen in finance -- namely, i was choosing the winners and the losers.
i realized that every data scientist was doing exactly that, and we were doing it based on things like: do you have a mac or do you have a pc? are you using chrome? are you using firefox? are you a high-value customer or a low-value customer? and we were deciding to make better offers, essentially, to people who looked like they had more money. and that was my perspective. i wanted to think otherwise, right? i wanted to think, oh, i'm doing something more benign than finance. but the more i thought about it, the more i realized that we were very deliberately creating exactly the same kind of social structures and silos on the internet that we had been trying to escape when we first created the internet. remember when we thought the internet was a democratizing force where everybody would have equal access to information?
well, that wasn't what i was building. so for the second time i realized that i was becoming complicit in something that was really evil. but it was actually, in my opinion, possibly worse than what had already happened in finance. whereas in finance everyone noticed when the financial crisis happened, in this new system where we were all pushing the lucky people up and pushing the unlucky people down, it was pretty much invisible to the people being pushed or nudged in either direction. and in particular, the people unfairly not getting the opportunity they should have gotten would never know they had ever been part of an algorithm, had ever been scored by an algorithm, and they'd be invisibly pushed downward. so, in other words, failures in finance were obvious to everyone. failures in data science were subtle; nobody would notice them, nobody would fix them.
that's not to say finance is fixed, but at least we know what the problems are. so i got down to business, helped along by my friend carrie, who's in the audience today. she started telling me about teachers -- she's got a high school around the corner. they were being scored by a mysterious, secret scoring system, the value-added model for teachers. they didn't know how the scores were built, but they knew that if they didn't get good scores they wouldn't get tenure. i looked into this and found out that this scoring system was horrible. some teachers got a 6 one year and a 96 the next year -- a very inconsistent scoring system -- even though these teachers weren't changing the way they taught. and i looked into it more: teachers were getting fired in washington, d.c., more than 200 in a single year, because of this system. it was an arbitrary, secret system.
then i started thinking about the commonalities between the different terrible algorithms i was seeing -- and i can tell you a lot more in the next few minutes -- having to do with credit scoring and getting insurance, having to do with trying to get a job, personality tests. i'm sure a lot of you have taken personality tests. secret algorithms may keep you from getting a job, and you can't complain if it's wrong because you don't know how it works. powerful and unfair, i would add. people consistently and constantly getting rejected for no reason, and there's no appeals process. i would see this in criminal justice: they would use these systems to decide how long a person would go to prison. it wasn't a deep look into the person's character. these were things that were demographic, by which i mean, where did you grow up?
did you grow up in a high-crime area? that is one of the sentencing criteria. i call them weapons of math destruction: they're secret, powerful and harmful, destructive. not only on an individual level -- secretly, because people didn't know about them, didn't understand them -- but they were also creating these terrible negative feedback loops for society. instead of getting rid of bad teachers, the value-added model was getting rid of teachers who didn't want to work under that system. and the final word -- i want to provoke and take questions about the other things i've said -- the final thing i'm saying, the larger way to look at it, is that it's increasing inequality. now, i should have started by saying that algorithms are constantly foisted upon us and described as objective, as unbiased, as if they're going to improve the world, right? inherently, because they're mathematical algorithms.
that's not a fact. there's nothing inherently fair or objective about algorithms. what i was seeing, when you add it up -- at every juncture of our lives, pushing us down or up -- was that the cumulative effect of this algorithmic pushing and nudging was the opposite of social mobility, right? the opposite of the american dream. for the rest of your life, we're going to keep you where you were. if that means you were born in a poor minority neighborhood, you're on the lower end of every single scale. and if you're born in a prestigious neighborhood, that means you're going to be considered a good bet in every situation. so far from being the objective marketing tools that we think of them as, algorithms -- the bad ones -- have the potential to do real harm to our society. and i'll finish by saying, i'm not anti-algorithm, but we do have to do a lot better.
>> thank you so much, cathy. secret, powerful and destructive -- that's how you're describing the algorithms that rule our lives. in your book you take us through -- i think you describe it as a journey through our virtual lives -- every domain, and my sense in reading this book was that there was nowhere to run. every chapter unveiled another algorithm that in some way was affecting my life directly. when we get to the q&a we'll want to hear more about that. i realize that the books are sitting back there. we are live on c-span; with permission, i'm going out of frame to get the books so you guys can see them and i can introduce them. >> all right. [laughter] >> you're going to love hearing about both of them. >> and mine? >> you want yours?
>> sure. >> why not? >> it took a long time getting this cover. >> it's a good cover. >> thank you. >> oh, really, i want to tell you about tim, and then we're going to bring these two books together. tim is a professor at columbia law school and a contributing opinion writer for the new york times. he's best known for his work on net neutrality theory -- he in fact coined the term net neutrality. he's the author of "the master switch" and "the attention merchants," along with "network neutrality, broadband discrimination" and several other works. he was named one of america's 100 most influential lawyers in 2013 and to the american academy of arts and sciences in 2017. author of "the attention merchants" -- tell us about this book. >> sure, thanks. what a great audience; i'm just really pleased. i was worried it was going to be a big data audience on the 7th floor and, you know, nobody was coming up.
>> the best audience. >> yes, this is great. and i think this is kind of like an internet hangover panel, you know what i mean? i think there's a theme to both of our conversations. i don't know about you, but in the '90s and early 2000s we saw a liberating promise in tech, the web and the internet, and thought all the things we'd struggled with before were going to be over. algorithms were going to solve our life problems, we'd get free stuff from google or facebook, it would make us better friends with people, we could find anything we want. and i think we're kind of at a point in history where people are sort of like, what happened? like the party went sour. like the counterculture in the '70s, at some point people were asking, what happened to the big dream? and i think, i hope, that we're still a little optimistic about tech.
but we're here, i think, to deliver, and sort of try to turn the ship back towards serving humanity, which would be my aspiration. so, this book is -- i like to write grand sweeping historical epics, that's how i like to think of them. it's part of a trilogy; the first one was "the master switch." this book is related to big data, but its general topic is the rise of human attention as an essential resource in western societies, and the rise of an industry that harvests attention and resells it. it's a story that starts here in new york city -- actually in manhattan, though probably brooklyn as well -- with the first ad-supported newspapers, and runs all the way through the conquest of the entire economy by
this very strange business model. it's a model where you get to know a lot about people -- they didn't know much in the 19th century, but they knew something -- you accumulate a giant audience like this one, and then you resell their attention to somebody else. you know, we live with it every day. almost all the stuff we use on the computer feels like it's free, and that's because, in fact, you're selling your data and your attention. and so i wanted to understand where that came from: how this very obscure, weird business model that used to power only tabloid papers, the new york sun being the first, spread to the entire economy. and the point that i think is interesting for today's discussion is a moment around the year 2000 when there was a startup named google that, like a lot of startups, had a great product that was starting to gain some traction, but didn't have any business model.
they were losing money, you know, like startups do. and, you know, word was going around, this thing is great. and they were like, how are we going to make money? now, it sort of seems obvious in retrospect that they turned to advertising, but the funny thing about google is that they always had -- especially larry page had -- this intrinsic disgust and hatred for advertising. larry page had written a sort of anti-advertising manifesto: for those of you who know the famous paper he wrote describing the google algorithm, in the appendix he wrote a screed against advertising -- any advertising-funded search engine is always going to be manipulative, always turned against the interests of its users. so that was google's original position. obviously, they changed, you know? huge piles of money have a way of changing one's mind about things, i guess. [laughter]
and so they adopted the model. and then later on it almost seemed natural that they would adopt the model: we have this much access to people's minds, and we can, in ways subtle and not so subtle, kind of make them do what we want, or at least shape what they want. going down that path, i think, was the path of darkness. and, you know, i understand publishers -- there are a lot of good reasons for advertising -- but i think the extent to which the web has become dependent on advertising
has resulted in it reaching, over the last year or two, what i would consider a rock bottom. you know, the web used to be kind of exciting, full of stuff. today it's become a vast wasteland, with a couple of exceptions. i mean, some of that stuff's fun, but overall the sort of original promise has gotten lost, and i think a lot of that has to do with the demands of this business model to deliver up the page views, deliver up the clicks. it's become a giant manipulation machine; that's its only business model. people in the '50s and '60s started to say, oh, man, the demand for ratings has ruined television. but i would say the contest for clicks makes the contest for ratings look dignified in comparison. [laughter] so i'm here to say, you know, i am at some level an optimist -- i don't really sound like one right now. [laughter]
but i am at some level. there are these moments where media reset themselves. television got better, you know, when it went mainly to a paid model and faced new competition from the web. i think we can rebuild the web, i think we can do better. i think we need to sort of go back and look at the sites that have preserved what's good about it. wikipedia's a good example. wikipedia somehow has managed -- you know, it's not perfect. occasionally there'll be entries on comic book characters which are longer than those on former presidents -- [laughter] you know, things like that. but, you know, some ground rules have really helped wikipedia, and i'll add a political side to this. you notice wikipedia hasn't had a fake news problem. one of the upshots of this business model, which just focuses on giving people what they want to see, maximizing the hours you spend on facebook or so forth, is feeding people exactly what they want.
it powers the filter bubble -- which is another book, actually, "the filter bubble" -- and that business model, the incessant demand that you give people exactly what they want to see, has pushed us very far towards the polarized politics of our current time. you know, everyone knows this, but the right hears exactly what they want to hear, the left hears exactly what they want to hear. it feeds upon itself and creates a vicious and insane kind of politics. i think the contest for attention has a lot to do, when you understand it, with some of the most successful leaders today, including our president. it's almost like he's president buzzfeed. it doesn't matter whether it's good or bad -- what are people watching, who wins the ratings at the end of every day? that has infected politics, and this book tells you where it came from. all right. >> i love it. [applause] >> thanks.
>> i'd like to come back to the internet hangover as a way to frame this. usually, tim and cathy, when you have a hangover, it was preceded by a good time -- >> yeah. >> -- at some point. [laughter] so was there a point where we were at a party that was going well, or has this been a hangover from the beginning? >> i think so, yeah. i think the early 2000s -- and you can differ; maybe you were in finance, it was a different scene -- but i think there was an extraordinary moment in the early 2000s, and i kind of chronicle it right here, where there was such a sense of possibility associated with the web, and maybe there still is to some degree. the idea that, you know, everyone would be free to be a publisher and sort of have their views out there. the birth of blogging, the first forums, the hobbyist sites, you know, where you could -- i mean, i have a lot of weird hobbies.
>> like what? [laughter] >> how long have you got? >> give us one. >> well, let's say i was really into old motorcycles, vintage honda motorcycles, for a while. >> cool. >> because i rode one when i was a little kid in asia. so you could find a site where they had every part identified. you know, it was like geek paradise. i also love -- this is still around -- i love the fact that you can watch a tv show and have a thousand people dissect exactly what everything means. but that's a little more recent. there was this moment when it was going to fix democracy, it was going to fix everything that was wrong with the mass media -- oh, i'm so sick of just being fed, like, this stupidity; everything's going to get better, and it's going to be led by all of us. and that was a remarkable time. i guess there was a little bit of premature triumphalism. this book has an awful lot on the '60s, where people in around 1969, 1970 were planning what the new order was about to become, only to realize they were actually at the height of it, not at the beginning of something bigger.
>> okay. >> sorry, i'm going on a little too long, but i think there was this real moment, and the grand failure was the failure to institutionalize it. we just kind of assumed, well, you know, it's different, the old rules don't apply, and didn't do anything to kind of bottle it. >> okay. >> with the exception of wikipedia, which set up a lot of rules. and i'll say also, everyone in silicon valley -- google, a lot of companies, a lot of well-meaning people -- said, oh, we're great. we can take a standard corporate for-profit model, and that won't affect us at all, you know? we'll still be a do-good kind of place, but we'll just be reporting to shareholders now and then, no big deal. and everyone kind of jumped on that. there was a moment where you thought you could have your cake and eat it too. >> a stayed-too-long-at-the-party situation. >> and now it's the walk of shame. [laughter]
>> 2007, right? so, believe it or not -- i mean, i didn't know anything about finance. i was a nerd, a math nerd, not a history nerd; i don't know history. i only know what i saw when i got there, which was a bunch of very, very smug rich people. and i would ask them questions like, what if liquidity isn't infinite? cathy, there's always liquidity, always. and then the crisis ensued. it was actually earlier inside than outside: it started in august of 2007. for the rest of the world it started a year later, but inside everybody was just, like, s---ing their pants, and they were like, wow. by the time i left in 2011 i had spent four years there, two years
at a hedge fund and two years trying to understand how all of that had ended in failure. it wasn't a math problem. people in finance had been chastened, but the attitude was, we are going to stay here as long as we can and get what we can while we can. >> and this is broadcast. >> sorry about that. and then i went into data science, and it was, like, in new york -- it wasn't silicon valley, it wasn't the center of the beast, but it was data science-y stuff -- and it was like going back in time to 2007 in finance. everybody was smug. everybody was like, we are doing good because we are making money. i mean, literally, that was the kind of reasoning being held: if we're doing good on the internet with fancy data models, and we're making lots of money, then we must be good for the world. and it didn't go beyond that,
and in some places it hasn't gone beyond that. i think the rest of the world is waking up to the fact that this isn't actually particularly good for us as a society, but i don't think everyone has heard that message yet, and i don't know if they will. >> so both of you, i think, are trying to get a message out. with your book, cathy, we barely know it's happening. by the time you work through these books, you have a sense, in tim's case, of being used, because the book is about how our mind share, our attention, our gaze is being sold and resold. and i'm noticing, in this room -- this is, i think, an ad-free space, tim, other than -- >> i'm not that hard core about it. >> but in your book you talk about how few ad-free spaces there are in our minds and lives.
cathy, in your book, what we barely know is happening is how the algorithms are shaping who is competing for our bandwidth and whose doors are being opened versus closed. so -- >> let me jump in there. one of the reasons i decided to write this book is that i was literally the only person -- like i mentioned, the only person -- who was worried about this. i was seeing it -- oh, my god, things are happening again -- but no one cared. the reason i thought no one cared was because, as a white, highly educated data scientist working in new york with, you know, good pocket money, i was never the victim. >> right. >> i was creating a system that made some people suffer and gave good options and opportunities to other people. and none of the people building those systems were the victims. >> that's interesting, i want to ask you about that.
so is the darkness and destructiveness of algorithms that some people benefit and some suffer, or is it that all of us benefit in some ways and all of us suffer in some ways? >> tim is talking about something larger, more meta and more diffuse when he talks about attention, and we can talk about the facebook algorithm. but i want to focus on the algorithms that are about power. straight-up power. like corporations that have minimum wage jobs -- a lot of people work at these companies, right? they don't want to actually interview everyone, because that costs a lot of money, so they give everyone personality tests and get rid of 90% of the applicants that way, which is, for them, great -- as long as the remaining 10% of people are great workers.
it's possible that 40% of the people rejected were rejected for no good reason. and this is not optional: you're applying for the job, you're applying online. it's not in particular about privacy, because you cannot say, no, i will not answer these questions, when you're applying for a job. for that matter, no public school teacher can say, no, i refuse to be measured by this algorithm. that's not allowed. so many of the algorithms i talk about are not something you can opt out of. and rich people typically don't have to go through this at all. people who are truly elite, like i am -- i don't have to take a personality test when i try to get a job, because i'm going to get interviewed by a senior person at the company. you see what i mean?
the data scientists building these things do not have to undergo them. the people who deploy them, who make them, who hire the data scientists -- they're not subject to them. so it's really a power thing, and we are more and more being sized up and divvied up by corporate power via these algorithms. i think that's the best way to be thinking about it.
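[a minimal sketch, with invented numbers, of the screening dynamic described above: a cheap, noisy test score used to reject 90% of applicants will also reject many people who would have been good hires. the pool size, the noise level and the unobservable "true quality" variable are illustrative assumptions, not anything stated on the panel.]

```python
import random

random.seed(0)

# Hypothetical applicant pool: each person has an unobservable "true quality"
# and a noisy personality-test score that is only loosely correlated with it.
# All numbers here are invented for illustration.
N = 10_000
quality = [random.gauss(0, 1) for _ in range(N)]
score = [q + random.gauss(0, 1.5) for q in quality]   # weak, noisy proxy

keep_n = N // 10   # the employer keeps the top 10% of test scores, rejects 90%

kept_by_test = set(sorted(range(N), key=lambda i: score[i], reverse=True)[:keep_n])
best_by_quality = set(sorted(range(N), key=lambda i: quality[i], reverse=True)[:keep_n])

missed = best_by_quality - kept_by_test
print(f"genuinely strong candidates rejected by the test: {len(missed) / keep_n:.0%}")
```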
>> is it fair to say -- you tell us a moving story in the book about a young man applying for a job and not getting interviews. he's qualified. >> qualified. >> he's going for jobs in a grocery store that he's technically overqualified for, and he's not getting interviews. what i hear in that kind of algorithm, and in the others you're describing that people cannot opt out of, is that there's a predictive quality: some data somewhere shows a correlation between x and y, and therefore it's decided that if x exists it will lead to y, and therefore we're going to pull out anybody with that predictor. we're making judgments on individuals based on characteristics. we call them stereotypes. >> you can call them stereotype machines. people are told that algorithms are fair because they're following the numbers. guess what: the numbers mean the data, and the data is a reflection of our society, and our society is racist, our society is classist. [applause] thank you -- or no thank you.
when you follow the data, you're propagating the status quo. i would say you're exacerbating the status quo. the truth is, as a society we're trying to evolve, and if we are blindly following these algorithms, we're doing the opposite: we're trusting these things that are keeping us back. the first step is for us all to get it. don't tell me my score was 20 out of 100 and i need to get fired; explain to me how i was assessed. one piece of good news, because i don't want to be a downer: in houston, a few months ago, teachers who got fired because of these scores sued, and the judge agreed that their due process rights were violated. it's not proper to be assessed
by an algorithm and fired for it. we're seeing this, and it's just the beginning, because so many people are intimidated and they trust mathematics. by the way, mathematics is trustworthy -- but this isn't mathematics. as a person who builds algorithms, i make many, many subjective choices when i build an algorithm. it's not just math, and then there are the flaws in the data, which, as i said -- >> i want to come back to the mathematical elements of algorithms, but -- you have a great line in your book where you say predictive models are fundamentally moral. you push us to think through a moral lens. >> there's no getting around the question of whether this is fair or
not fair. there is no way that an algorithm can be objective. as long as we're talking about people and whether they're good at their job and smart enough and qualified, that's not an objective question, it's a subjective question. we have to decide what fairness looks like instead of trusting that past history, past practice, was good enough. we have to force the algorithm to bend to our will, in that sense. >> thank you, cathy. tim, i'm going to ask you one more question -- i know i'm monopolizing this. one more question, and then we're coming to you, so be ready. tim, i want to ask you about rituals. you talk in your book about a few different kinds of rituals, and i'll let you elaborate. >> sure. this book, as i said, is a history of attention harvesting, so it goes through a lot of time and characters and things happening,
and one of the most important things shaping how people spend their time and attention is rituals. you know, for example, just this very idea -- many of you may have this idea that you need to read the news every day. you know, that didn't always exist. newspapers, of course, didn't always exist, but even that idea, that i need to sort of catch up on the news, is in some ways an invention -- a bad one in some ways, but an invention all the same. and it originated in the coffee houses in england in the 18th century. another important ritual is the thing called prime time. we all kind of know that word, but think about the idea of the entire nation -- let's take the year 1953 or so -- on a sunday night at 8 p.m., the entire nation sits down, basically, to watch the ed sullivan show, all at the same time, everybody focused on one speaker.
it's extraordinary; nothing like that had happened before in human history, and it may never happen again except with the occasional super bowl. now it's a big event when the entire nation watches the same thing; it's unusual. the idea that after dinner everybody will sit there and focus their attention on a couple of hours of tv -- monday, "i love lucy" -- every day there was one. and prime time is still a strong thing, though some people don't do it anymore. but constant checking, i think, has really become the ritual that's built the current web and structured our lives: that you need to check your e-mail and see what happened on
facebook, or twitter, and go on snap or instagram, depending on who you are, and then do it again. and maybe you feel that itch to grab your phone -- that's essentially attention capture. if you only go to a site once, that's it. the industry dives into the dark arts, essentially, of creating addictive products, of creating things that both kind of keep you hooked and somehow can deliver on the promise of manipulation. i hate to use that word, but it is what it is. i think this is where our books actually link up. one of the things that has happened over the last 10, 15 years is that we've gotten better at ignoring and avoiding advertising, either by zipping through it or developing that weird kind of blindness where you don't see
it. your brain just tunes it out. so the industry has turned to methods of manipulating people who don't know they're being manipulated. and you find yourself asking, why am i buying this? maybe at some point it was subtly recommended to me, i don't know. i've caught myself, experimenting on myself: why do i start to believe certain things politically about what's going on, whatever it is? i don't like to be a conspiracy theorist, but there are talented data miners wherever the money is. don't think you can outsmart them. that's our problem: we're vain creatures, and we always think, you know, i'm smarter than the algorithms.
the advertisements -- a lot of people say advertisements don't work on me, i'm special. >> thank you. we're going to open it up to questions. the mics are there. please stand up. >> i hope this isn't a dumb question, but what's an algorithm? >> oh, thank you, thank you, a great question. [applause] >> good question. i'm sorry, i forgot -- i only had ten minutes, so i was holding that back. an algorithm is something that you do -- every person does it in their own head -- where you use prior information to predict something. the example i like to give, and this is a great example because it shows how subjective it is, is this.
i cook dinner for my family. the data i have on a given night are the ingredients in my kitchen. and, by the way, i curate that data: i don't use everything in the kitchen -- i don't use the packages of ramen noodles -- so i'm curating my information. then we make decisions: we have dinner together, and i assess, was that a success? that depends on what success looks like. for me, success is that my kids ate vegetables. for my eight-year-old, it's whether he got to eat nutella -- that matters to him. over time, we optimize for success.
the next day i've learned that it wasn't successful -- too much nutella and not enough broccoli. over time, the dinners i cook depend on that definition of success. facebook optimizes its algorithm -- the news feed algorithm -- to keep you on facebook, to optimize its profit. what would be the nutella version of that? how about optimizing it to give us true facts? that's a different algorithm, because it's a different definition of success. does that make sense? >> we all do algorithms.
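[a toy sketch of the point just made: the data stay the same, and the subjective choice of what counts as success changes what the algorithm recommends -- the same swap she describes between optimizing a news feed for time-on-site versus for true facts. the dinners and their scores below are invented for illustration.]

```python
# Same data, two definitions of success: the "algorithm" picks different dinners.
# The dinners and their scores are invented for illustration.
dinners = [
    {"name": "stir-fry", "vegetables": 3, "nutella": 0},
    {"name": "pasta",    "vegetables": 1, "nutella": 0},
    {"name": "crepes",   "vegetables": 0, "nutella": 2},
]

def parent_success(dinner):          # success = the kids ate vegetables
    return dinner["vegetables"]

def eight_year_old_success(dinner):  # success = he got to eat Nutella
    return dinner["nutella"]

print(max(dinners, key=parent_success)["name"])          # stir-fry
print(max(dinners, key=eight_year_old_success)["name"])  # crepes
```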
>> our next question is from here, and then the third one, all the way in the back. [inaudible] >> can you hear me? >> i can hear you. >> you're talking about big data and data collection and the way that people can be steered politically. tim wu, in your book "the attention merchants" you talk about the decades-old programs and the way that -- it's different now. it's gotten much more granular with big data, and this comes into everything from gerrymandering to the targeting that was in the news this week. >> great, thank you. a great question. it sounds like you already read the book -- thank you for that. so, in the eastern bloc during communist times, there would be a folder on every citizen of note that would have, you know, a bit of
everything, and we thought that was the craziest thing ever. well, the folders held on us now are far more detailed and informative about every aspect of our lives than anything those governments ever had. facebook, and google notably, have more information, and it's collected voluntarily. when i first signed up with facebook, i thought, i'll tell them everything about myself. i'm not sure why i did that. [laughter] oh, because then my friends will find me, something like that. we've kind of willingly handed over a massive transfer of data over the last ten years. oddly enough, facebook has
never paid us for that. the data matters for the business model: the more you know about people, the better. people with a gambling problem, you give them casinos. and there are subtle things you know about people that are very useful, first for advertisements, but also for holding their attention. this is where the politics comes in. if you know someone is progressive, they want a story every day about what a bastard trump is -- you want to hear it, and you just want it. if you're a conspiracy theorist, you want a conspiracy theory every day. you want certain things, and they will be delivered to you over and over again. and that's where attention and data meet. >> thank you. he's got it? perfect, thanks. >> i'm a biostatistician, and i feel like there's a constant
dialogue, conflict at times, between statisticians and those who are doing big data, the data scientists. and there's always this idea that, well, those who are doing algorithms and so forth are not using causal inference, things like that. you were talking earlier about the oppressive or subjective way in which we analyze things, and i wanted to know if you could talk more about that. even though you see statisticians arguing for more rigor in big data, you see, for example, causal inference -- or the lack thereof -- being used and kind of reinforcing the discrimination itself. and so it's a very interesting
discourse to see statisticians claiming that big data doesn't do causal inference, when both are starting from the subjective discrimination that one has. so i wanted to know if you could maybe talk more about that. >> cathy, do you want to take this? >> i'm going to dumb this down with a thought experiment. i think this is the best thought experiment i've come up with: fox news. imagine that fox news replaces its hiring process with a machine-learning algorithm. you're all experts in algorithms now, thanks to the man who asked me what an algorithm is. any data scientist would use the most relevant data, and they'd have to define success: what does a person who was successful at fox
news in the past look like? what does it take? the standard answer is someone who stayed for a long time, got promoted, somebody who got lots of raises. that defines success, and then you train the algorithm to find people who would be successful at fox news. and i chose fox news for a reason: we happen to know that women and african-americans were systematically discriminated against there -- they were not allowed to succeed in that culture, right? just as a thought experiment. now, imagine training that algorithm and applying it to a new pool of applicants. what would happen? it would systematically remove women and african-americans, because it would say those people don't look like the people who were successful in the past.
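[a deliberately simplified sketch of that thought experiment, with invented numbers: historical "success" labels that encode discrimination train a scoring rule, which then filters a new, equally qualified applicant pool. a real system would learn from proxy features rather than the group label itself, but the effect she describes is the same.]

```python
import random

random.seed(1)

# Invented history: men and women are equally skilled, but women were rarely
# allowed to stay and get promoted, so their historical "success" rate is low.
def person(group):
    return {"group": group, "skill": random.gauss(0, 1)}

history = [person(random.choice(["men", "women"])) for _ in range(20_000)]
for p in history:
    succeeded = p["skill"] > 0.5                         # same skill bar for everyone...
    if p["group"] == "women":
        succeeded = succeeded and random.random() < 0.3  # ...but women rarely allowed to succeed
    p["successful"] = succeeded

# The simplest possible "model": your group's historical success rate becomes
# the score the algorithm assigns to you.
rate = {g: sum(p["successful"] for p in history if p["group"] == g) /
           sum(1 for p in history if p["group"] == g)
        for g in ("men", "women")}

# Apply it to a new, equally qualified applicant pool with an arbitrary cutoff.
applicants = [person(random.choice(["men", "women"])) for _ in range(1_000)]
hired = [a for a in applicants if rate[a["group"]] > 0.15]

print(rate)                                              # women's historical "success" rate is far lower
print({g: sum(a["group"] == g for a in hired) for g in ("men", "women")})
```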
does that make sense? is that causal? causality is a question of why. it would require asking the question: why weren't they successful? was it because they were bad workers, or because the culture of fox news didn't allow them to be successful? that question isn't asked by an algorithm unless you ask it of the algorithm. or, another way of thinking about causality: why are we choosing this to be success? all of these things are prone to implicit bias problems. maybe we should instead ask the question, are you qualified to work here? and if you're qualified, why aren't you staying longer? what's wrong with the culture, that highly qualified people aren't getting raises and promotions?
that's how we have to think about causality. instead, we're lazy: we say, we studied this, and it saved you tons on hr people. i hope that helps. >> thank you very much. unfortunately, we're told we're at time and we're not going to get to the remaining questions. >> one more? >> one more, quick. >> first of all, i mean, i -- >> short question, short answer. >> thank you guys for all -- [inaudible] hopefully, it's -- for young people. >> yeah, sure. >> they need it most. >> okay, so i have two things to say, and i'm going to kind of close this out. this book has a little bit to do with -- a lot to do with -- our lives. at the end of your life, william james said, your life is what you actually paid attention to.
that's it. there's nothing else. i think you need to think of your attention as a resource that you spend. i think it's a great idea to ask, how do i spend my attention, and is it taking me to the kind of life i want? writing my own book, i was like, i should read more books, because when i read books -- and i should spend less time -- do you ever have the experience where you go to write an e-mail and four hours go by? it's like a casino. you need to be aware that we're living in a casino and really be hard core about how you're spending your attention. the second thing is for the people in tech in this room, and young people: giving new birth to a better web, a better internet -- we can fix this, we can save it. not just naively thinking we can have all the technology and keep everything else the same.
we need to change, we need to make a web and an internet that actually serves humanity. that's a big deal, and those are my two ideas. >> i love it. thank you, guys. thank you very much. [applause] >> the authors will be signing books outside at signing table h. turn right when you leave the building. [inaudible conversations] >> you're watching booktv on c-span2, television for serious readers. here's our prime time lineup. first up tonight at 7 p.m. eastern, we've got a block of authors discussing education and education reform. you'll hear from former pbs "newshour" education correspondent john merrow,
author of "addicted to reform." followed by "they're your kids." and kathy davidson, founding director of the futures initiative at the city university of new york and author of "the new education. decides on booktv's "after words" at 10 p.m., former radio host and msnbc contributor charles sykes discusses the conservative movement in america with fox news contributor tammy bruce. and at 11:15, kevin perino recalls the response to the takeover of china in 1949. that all happens tonight on c-span2's booktv. three days of nonfiction authors and books on this holiday weekend. >> as jack said to me memorably, jimmy carter was arguably the most intelligent president of the 20th century. katherine graham said so, tip o'neill said he was the most
8:54 am
intelligent president they'd ever -- as jack put it, he could consume amazing quantities of information and assimilate them and use them. but i was having a conversation with brent scowcroft at one point, you know, bush 41's amazing national security adviser, and he said, you know, zbig and i were talking one day, and zbig said i love this guy. i can give him a 50-page memo in the afternoon, and i get it back the next morning with notes in the margins on every page. and scowcroft looked at him and said, zbig, that's the worst thing you could possibly do. [laughter] he doesn't have time for that. jimmy carter, i think, got bogged down in the minutiae. i mean, in fairness, you know,
stu eizenstat will give you chapter and verse, and i'm sure jack could too, on all the legislation that was passed early in the carter administration -- you know, more legislation than any president since lbj. but he couldn't prioritize. you need a chief of staff to prioritize, to make sure that the narrative is consistent, to make sure that everybody's on the same page. none of that is happening, clearly, in the present day. [laughter] but he suffered from not having a white house chief of staff from day one. and in my opinion, jack would have been a great one. >> well, you know, one of the things, when you start out your book, you talk about what seems like just the most logical kind of meeting in advance of an administration taking office, and that's bringing former chiefs of staff together -- and in this case it was to --
>> rahm emanuel, yeah. >> -- to bring him up to speed. and it had most of the chiefs of staff there to give him advice. jack, you were there; what was it like? what was that meeting like? [laughter] >> it was funny. >> december 5, 2008. >> december 5, 2008. josh bolten, who was the outgoing president's chief of staff -- george w.'s -- had gathered this group, and there were 13 or 14 of us there, i think. we sat around a table in the chief of staff's office having breakfast and talking. rahm was sitting next to me at the meeting. we just went around the table, and each one of us made a brief, very brief statement of some little piece of advice that we thought was helpful or that would give some guidance -- some of it humorous, a lot of it humorous. i'll give you an example. when it got around to dick cheney, you will remember he was
the vice president -- [laughter] >> and a hell of a chief of staff for jerry ford. >> he was the chief of staff that i met when we were elected. when it got around to dick, who was at the end of the table, almost at the very end of all of us -- dick is an interesting man. [laughter] he leaned forward like this, and he said, i have one piece of advice: keep your vice president under control. [laughter] >> the other piece of advice i loved was ken duberstein's, reagan's final chief of staff. he's a great storyteller. anyway, he looked very gravely at rahm and said, never forget
that when you open your mouth, it's not you who's speaking, but the president of the united states. to which rahm said, oh, blank. [laughter] and brought down the house. >> here's a look at some authors recently featured on booktv's "after words," our weekly author interview program. investigative journalist art levine reported on the mental health industry. new york times magazine contributor suzy hansen reflected on her travels abroad and weighed in on america's global standing. and progressive policy institute senior fellow david osborne examined the charter school movement and offered his outlook for the future of public education. in the coming weeks on "after words," craig shirley will discuss the life and political career of newt gingrich. federal judge jon newman will detail his career in the judicial system, first as a prosecutor and currently as a
federal appellate judge. former "face the nation" anchor bob schieffer will examine the role of the media today. and this weekend on "after words," former radio host and msnbc contributor charles sykes will provide his thoughts on the conservative movement in america. >> i'm a conservative who believes that america should be the shining city on the hill, that we are based on an idea. they reject the idea. they reject the values of the declaration of independence rather explicitly. they don't believe that america is an idea; they believe that it is a geographic location, peopled by people of certain ethnic and racial backgrounds. and there's a real dark side there. and the reality is that throughout the campaign donald trump had more than one opportunity to repudiate them, reject them, speak out against them, and he dodged and delayed and winked again and again and again. it's a pattern. it's not -- this is not a
one-off, this is a problem. this is a cancer at the heart of the conservative movement if we are not willing to say that the alt-right is not right, that we will not tolerate these people. so there are these moral judgments, these key moments in every movement. you know, liberalism had to expel the communists in the late 1940s. conservatism had to expel the birchers in the '60s. we have to deal with the alt-right. >> "after words" airs on booktv every saturday at 10 p.m. and sunday at 9 p.m. eastern. you can watch all "after words" programs on our web site, booktv.org. >> c-span, where history unfolds daily. in 1979, c-span was created as a public service by america's cable television companies and is brought to you today by your cable or satellite provider.
and most of the world expected but hoped it wouldn't happen but it did and that launched world war


